Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing

Authors

Abstract

With the advent of pre-trained language models, many natural language processing tasks in various languages have achieved great success. Although some research has been conducted on fine-tuning BERT-based models for syntactic parsing, and several Arabic pre-trained models have been developed, no attention has been paid to Arabic dependency parsing. In this study, we attempt to fill this gap and compare nine Arabic pre-trained models, fine-tuning strategies, and encoding methods. We evaluated them on three Arabic treebanks to highlight the best options for capturing dependencies in Arabic data. Our exploratory results show that the AraBERTv2 model provides the best scores on all treebanks and confirm that fine-tuning the higher layers of the models is required. However, adding an additional neural network on top of those models drops the accuracy. Additionally, we found differences among the techniques that give the highest scores. The analysis of the errors obtained on the test examples highlights four issues with an important effect on the results: parse tree post-processing, contextualized embeddings, erroneous tokenization, and erroneous annotation. This study reveals a direction for future work to achieve enhanced Arabic dependency parsing.
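As a rough illustration of the kind of parsing head typically attached to a fine-tuned BERT-style encoder in this line of work, the sketch below scores head-dependent arcs with a biaffine function and picks each token's head greedily. The embeddings, dimensions, and greedy decoding are illustrative assumptions, not the paper's exact architecture; in practice the vectors would come from a fine-tuned encoder such as AraBERTv2, and decoding would enforce a well-formed tree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy contextual embeddings for a 5-token sentence (hidden size 8);
# in the paper's setting these would come from a fine-tuned AraBERTv2 encoder.
n, d = 5, 8
H = rng.normal(size=(n, d))

# Biaffine arc scorer: score(i, j) = h_i^T U h_j + b^T h_j,
# read as "how plausible is token j as the head of token i".
U = rng.normal(size=(d, d)) * 0.1
b = rng.normal(size=d) * 0.1

scores = H @ U @ H.T + (H @ b)[None, :]  # scores[i, j]: j as head of i
np.fill_diagonal(scores, -np.inf)        # a token cannot head itself

# Greedy head selection; a real parser would instead decode a
# well-formed tree, e.g. with the Chu-Liu/Edmonds MST algorithm.
heads = scores.argmax(axis=1)
print(heads)
```

The biaffine term `h_i^T U h_j` lets the scorer model interactions between the dependent and candidate-head representations, which is why this head is a common choice over a plain feed-forward scorer.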


Related Papers

Utilizing Dependency Language Models for Graph-based Dependency Parsing Models

Most previous graph-based parsing models increase decoding complexity when they use high-order features, due to exact-inference decoding. In this paper, we present an approach to enriching high-order feature representations for graph-based dependency parsing models using a dependency language model and beam search. The dependency language model is built on a large amount of additional autoparsed...


Dependency Language Models for Transition-based Dependency Parsing

In this paper, we present an approach to improve the accuracy of a strong transition-based dependency parser by exploiting dependency language models that are extracted from a large parsed corpus. We integrated a small number of features based on the dependency language models into the parser. To demonstrate the effectiveness of the proposed approach, we evaluate our parser on standard English ...


Tuning DeSR for Dependency Parsing of Italian

DeSR is a statistical transition-based dependency parser that learns from a training corpus suitable actions to take in order to build a parse tree while scanning a sentence. DeSR can be configured to use different feature models and classifier types. We tuned the parser for the Evalita 2011 corpora by performing several experiments of feature selection and also by adding some new features. The...


Probabilistic Models for Action-Based Chinese Dependency Parsing

Action-based dependency parsing, also known as deterministic dependency parsing, has often been regarded as a time-efficient parsing algorithm, while its parsing accuracy is a little lower than the best results reported by more complex parsing models. In this paper, we compare action-based dependency parsers with complex parsing methods such as all-pairs parsers on the Penn Chinese Treebank. For Chin...


Probabilistic Parsing Action Models for Multi-Lingual Dependency Parsing

Deterministic dependency parsers use parsing actions to construct dependencies. These parsers do not compute the probability of the whole dependency tree; they only determine parsing actions stepwise via a trained classifier. To globally model the parsing actions of all steps taken on the input sentence, we propose two kinds of probabilistic parsing action models that can compute the prob...



Journal

Journal: Applied Sciences

Year: 2023

ISSN: 2076-3417

DOI: https://doi.org/10.3390/app13074225